59 research outputs found
Risk-reducing design and operations toolkit: 90 strategies for managing risk and uncertainty in decision problems
Uncertainty is a pervasive challenge in decision analysis, and decision
theory recognizes two classes of solutions: probabilistic models and cognitive
heuristics. However, engineers, public planners, and other decision-makers
instead use a third class of strategies that could be called RDOT
(Risk-reducing Design and Operations Toolkit). These include incorporating
robustness into designs, contingency planning, and other strategies that fall
into neither of those two categories. Moreover,
identical strategies appear in several domains and disciplines, pointing to an
important shared toolkit.
The focus of this paper is to develop a catalog of such strategies and a
framework for them. The paper finds more than 90 examples of such
strategies, falling into six broad categories, and argues that they provide an
efficient response to decision problems that are seemingly intractable due to
high uncertainty. It then proposes a framework to incorporate them into
decision theory using multi-objective optimization.
Overall, RDOT represents an overlooked class of responses to uncertainty.
Because RDOT strategies do not depend on accurate forecasting or estimation,
they could be applied fruitfully to certain decision problems affected by high
uncertainty and make them much more tractable.
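The proposed multi-objective framing can be illustrated with a minimal sketch. The strategy names and scores below are invented for illustration and are not from the paper; the only idea taken from the abstract is that candidate strategies scored on several objectives can be screened by Pareto dominance rather than by a single forecast-dependent criterion.

```python
# Hypothetical sketch: screening candidate risk-reducing strategies with a
# Pareto filter. Each candidate is scored on several objectives, oriented so
# that larger is better (cost is negated); non-dominated candidates survive.

def dominates(a, b):
    """True if tuple a is at least as good as b on every objective
    and strictly better on at least one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(candidates):
    """Return the non-dominated subset of {name: objective_tuple}."""
    return {
        name: scores
        for name, scores in candidates.items()
        if not any(dominates(other, scores)
                   for o, other in candidates.items() if o != name)
    }

# Illustrative scores (negated cost, robustness margin) -- invented numbers.
strategies = {
    "add_redundancy":     (-5.0, 0.9),
    "contingency_plan":   (-2.0, 0.6),
    "overbuild_capacity": (-6.0, 0.8),  # dominated by add_redundancy
    "do_nothing":         ( 0.0, 0.1),
}
print(sorted(pareto_front(strategies)))
# → ['add_redundancy', 'contingency_plan', 'do_nothing']
```

Note that the filter needs no probability estimates at all, which matches the abstract's point that RDOT strategies do not depend on accurate forecasting.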
On solving decision and risk management problems subject to uncertainty
Uncertainty is a pervasive challenge in decision and risk management and it
is usually studied by quantification and modeling. Interestingly, engineers and
other decision makers usually manage uncertainty with strategies such as
incorporating robustness, or by employing decision heuristics. The focus of
this paper is to develop a systematic understanding of such strategies,
determine their range of application, and develop a framework for employing
them more effectively.
Based on a review of a dataset of 100 decision problems, this paper finds
that many decision problems have pivotal properties, i.e., properties that
enable solution strategies, and identifies 14 such properties. An analyst
can first find these properties in a given problem, and then utilize the
strategies they enable. Multi-objective optimization methods could be used to
make investment decisions quantitatively. The analytical complexity of decision
problems can also be scored by evaluating how many of the pivotal properties
are available. Overall, we find that, in the light of pivotal properties,
complex problems under uncertainty frequently appear surprisingly tractable.
Comment: 12 pages
Optimal Interdiction of Unreactive Markovian Evaders
The interdiction problem arises in a variety of areas including military
logistics, infectious disease control, and counter-terrorism. In the typical
formulation of network interdiction, the task of the interdictor is to find a
set of edges in a weighted network such that the removal of those edges would
maximally increase the cost to an evader of traveling on a path through the
network.
Our work is motivated by cases in which the evader has incomplete information
about the network or lacks planning time or computational power; for example,
when authorities set up roadblocks to catch bank robbers, the criminals do not
know all the roadblock locations or the best escape route.
We introduce a model of network interdiction in which the motion of one or
more evaders is described by Markov processes and the evaders are assumed not
to react to interdiction decisions. The interdiction objective is to find an
edge set of size B that maximizes the probability of capturing the evaders.
We prove that, like the standard least-cost formulation for deterministic
motion, this interdiction problem is NP-hard. Unlike that problem, however,
our objective is submodular, so the optimal solution can be approximated
within a factor of 1-1/e using a greedy algorithm. Additionally, we exploit
submodularity through a priority evaluation strategy that eliminates the linear
complexity scaling in the number of network edges and speeds up the solution by
orders of magnitude. Taken together the results bring closer the goal of
finding realistic solutions to the interdiction problem on global-scale
networks.
Comment: Accepted at the Sixth International Conference on Integration of AI
and OR Techniques in Constraint Programming for Combinatorial Optimization
Problems (CPAIOR 2009)
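The lazy-greedy idea behind a priority evaluation strategy can be sketched in a few lines. This is an illustrative stand-in, not the paper's implementation: the objective here is the fraction of sampled evader trajectories that pass through at least one interdicted edge, which is a coverage function and therefore monotone submodular. The toy trajectories and edge names are invented.

```python
import heapq

def coverage(edges_chosen, trajectories):
    """Fraction of trajectories hitting any chosen edge (submodular)."""
    return sum(any(e in traj for e in edges_chosen)
               for traj in trajectories) / len(trajectories)

def lazy_greedy(candidate_edges, trajectories, budget):
    """Pick up to `budget` edges greedily. Submodularity guarantees marginal
    gains only shrink, so a stale heap priority is an upper bound: if a
    refreshed gain still beats the next-best stale bound, we can commit
    without re-evaluating every remaining edge."""
    chosen, base = set(), 0.0
    # Max-heap of (negated stale marginal gain, edge).
    heap = [(-coverage({e}, trajectories), e) for e in candidate_edges]
    heapq.heapify(heap)
    while heap and len(chosen) < budget:
        _, e = heapq.heappop(heap)
        gain = coverage(chosen | {e}, trajectories) - base  # refresh gain
        if not heap or gain >= -heap[0][0]:   # still best: commit to it
            chosen.add(e)
            base += gain
        else:                                 # stale: reinsert with new gain
            heapq.heappush(heap, (-gain, e))
    return chosen, base

# Toy data: each trajectory is the set of edges an evader traverses.
paths = [{"ab", "bc"}, {"ab", "bd"}, {"cd", "bd"}, {"ce", "ef"}]
edges = {"ab", "bc", "bd", "cd", "ce", "ef"}
picked, prob = lazy_greedy(edges, paths, budget=2)
print(picked, prob)  # → {'ab', 'bd'} 0.75
```

The 1-1/e approximation guarantee of plain greedy carries over unchanged, since lazy evaluation only skips recomputations that submodularity proves unnecessary.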
Generating realistic scaled complex networks
Research on generative models is a central project in the emerging field of
network science, and it studies how statistical patterns found in real networks
could be generated by formal rules. Output from these generative models is then
the basis for designing and evaluating computational methods on networks, and
for verification and simulation studies. During the last two decades, a variety
of models has been proposed with an ultimate goal of achieving comprehensive
realism for the generated networks. In this study, we (a) introduce a new
generator, termed ReCoN; (b) explore how ReCoN and some existing models can be
fitted to an original network to produce a structurally similar replica;
(c) use ReCoN to produce networks much larger than the original exemplar; and
finally (d) discuss open problems and promising research directions. In a
comparative experimental study, we find that ReCoN is often superior to many
other state-of-the-art network generation methods. We argue that ReCoN is a
scalable and effective tool for modeling a given network while preserving
important properties at both micro- and macroscopic scales, and for scaling the
exemplar data by orders of magnitude in size.
Comment: 26 pages, 13 figures, extended version; a preliminary version of the
paper was presented at the 5th International Workshop on Complex Networks and
their Applications
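The fit-then-scale workflow can be illustrated with a deliberately crude stand-in generator. ReCoN itself preserves far richer structure; the sketch below only preserves the exemplar's average degree while generating a graph s times larger, using a plain Erdős–Rényi model. The toy exemplar and all numbers are invented.

```python
import random

def average_degree(n, edges):
    """Mean degree of an undirected simple graph with n nodes."""
    return 2 * len(edges) / n

def scaled_random_graph(n, edges, s, seed=0):
    """Generate an s-times-larger random graph whose expected average
    degree matches the exemplar's. This is a crude stand-in for a real
    structure-preserving generator such as ReCoN."""
    rng = random.Random(seed)
    m = s * n
    p = average_degree(n, edges) / (m - 1)  # keeps expected degree fixed
    scaled_edges = [(u, v) for u in range(m) for v in range(u + 1, m)
                    if rng.random() < p]
    return m, scaled_edges

# Toy exemplar: a 4-node path graph (average degree 1.5).
exemplar_edges = [(0, 1), (1, 2), (2, 3)]
m, big = scaled_random_graph(4, exemplar_edges, s=10)
print(m, round(average_degree(m, big), 2))
```

Evaluating a real generator then amounts to comparing many such statistics (degree distribution, clustering, community structure) between the replica and the exemplar, not just the average degree preserved here.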